Iterative Programming of Noisy Memory Cells

Authors

Abstract

In this paper, we study a model that mimics the programming operation of memory cells. This model was first introduced by Lastras-Montano et al. for continuous-alphabet channels, and later by Bunte and Lapidoth for discrete memoryless channels (DMCs). Under this paradigm, cells are assumed to be programmed sequentially and individually. The programming process is modeled as transmission over a channel, such that it is possible to read the cell state in order to determine whether programming succeeded and, in case of failure, to reprogram the cell again. Reprogramming can reduce the bit error rate; however, it comes at the price of increasing the overall programming time and thereby affecting the writing speed of the memory. An iterative programming scheme is an algorithm which specifies the number of attempts to program each cell. Given the channel and constraints on the average and maximum number of attempts per cell, optimal programming schemes maximize the number of bits that can be reliably stored. We extend these results and study the problem when the channel is either a discrete-input symmetric channel (including the BSC, BEC, and BI-AWGN channels) or the $Z$ channel. For the BSC and BEC, our analysis is also extended to the setting where the error probabilities on consecutive writes are not necessarily the same. Lastly, we study a related model motivated by the synthesis of DNA molecules.
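The write-read-rewrite loop described in the abstract can be sketched as a small simulation over a BSC. This is a minimal illustration under simplifying assumptions, not the paper's scheme: it assumes a noiseless read-back and a fixed per-cell attempt budget `t_max` (both names are illustrative), and it measures the resulting bit error rate and average number of writes per cell.

```python
import random

def program_cells(bits, p, t_max, rng):
    """Iteratively program each bit over a BSC(p): write, read back
    (assumed noiseless here), and rewrite on failure, up to t_max
    attempts per cell."""
    stored, writes = [], 0
    for b in bits:
        for _ in range(t_max):
            writes += 1
            cell = b ^ (rng.random() < p)   # BSC flips the bit w.p. p
            if cell == b:                    # read-back confirms success
                break
        stored.append(cell)
    ber = sum(s != b for s, b in zip(stored, bits)) / len(bits)
    avg_writes = writes / len(bits)
    return ber, avg_writes

rng = random.Random(0)
bits = [rng.randint(0, 1) for _ in range(100_000)]
ber1, w1 = program_cells(bits, 0.1, 1, rng)  # single write per cell
ber3, w3 = program_cells(bits, 0.1, 3, rng)  # up to 3 writes per cell
```

With noiseless verification the residual error probability drops roughly as $p^{t_{\max}}$, while the average number of writes grows only slightly above one, which is the speed-versus-reliability trade-off the abstract refers to.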


Similar articles

Iterative Concept Learning from Noisy Data

In the present paper, we study iterative learning of indexable concept classes from noisy data. We distinguish between learning from positive data only and learning from positive and negative data; synonymously, learning from text and informant, respectively. Following [20], a noisy text (a noisy informant) for some target concept contains every correct data item infinitely often while in additio...


An iterative method for tri-level quadratic fractional programming problems using fuzzy goal programming approach

Tri-level optimization problems are optimization problems with three nested hierarchical structures, where in most cases conflicting objectives are set at each level of the hierarchy. Such problems are common in management, engineering design, and decision-making situations in general, and are known to be strongly NP-hard. Existing solution methods lack universality in solving these types of pro...


Generalization Error Bounds for Noisy, Iterative Algorithms

In statistical learning theory, generalization error is used to quantify the degree to which a supervised machine learning algorithm may overfit to training data. Recent work [Xu and Raginsky (2017)] has established a bound on the generalization error of empirical risk minimization based on the mutual information I(S;W) between the algorithm input S and the algorithm output W, when the loss f...


Iterative Learning with Open-set Noisy Labels

Large-scale datasets possessing clean label annotations are crucial for training Convolutional Neural Networks (CNNs). However, labeling large-scale data can be very costly and error-prone, and even high-quality datasets are likely to contain noisy (incorrect) labels. Existing works usually employ a closed-set assumption, whereby the samples associated with noisy labels possess a true class con...


A new iterative with memory class for solving nonlinear equations

In this work we develop a new optimal without memory class for approximating a simple root of a nonlinear equation. This class includes three parameters. Therefore, we try to derive some with memory methods so that the convergence order increases as high as possible. Some numerical examples are also presented.
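The "with memory" idea mentioned above can be illustrated by the classical secant iteration, which reuses the previous iterate as its memory to raise the convergence order above 1 without evaluating derivatives. This is a generic textbook sketch, not the class of methods from that paper; the test function and starting points are illustrative:

```python
def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """With-memory iteration: each step reuses the previous iterate,
    achieving convergence order (1 + sqrt(5)) / 2 ~ 1.62 with only
    one new function evaluation per step and no derivatives."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if abs(f1) < tol:
            return x1
        # Secant update: replace f'(x1) by the finite difference
        # (f1 - f0) / (x1 - x0) in Newton's step.
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
    return x1

root = secant(lambda x: x**3 - 2, 1.0, 2.0)  # approximates 2**(1/3)
```

Methods of the kind surveyed in that paper carry additional free parameters that are updated from past iterates, pushing the order higher still; the secant method is simply the smallest member of this family.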



Journal

Journal title: IEEE Transactions on Communications

Year: 2022

ISSN: 1558-0857, 0090-6778

DOI: https://doi.org/10.1109/tcomm.2021.3130660